ELEN6887 Lecture 7: PAC bounds and Concentration of Measure

Author

  • R. Castro
Abstract

2 Agnostic Learning We will proceed without making any assumptions on the distribution P_XY. This situation is often termed Agnostic Learning. The root of the word agnostic literally means "not known." The term agnostic learning is used to emphasize the fact that often, perhaps usually, we have no prior knowledge about P_XY and f*. The question then arises of how we can reasonably select an f ∈ F in this setting.
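One natural answer, developed in lectures of this kind, is empirical risk minimization: with no knowledge of P_XY, pick the f ∈ F that makes the fewest errors on the training sample. The sketch below is illustrative only and not taken from the lecture; the toy threshold class, the noise level, and all names are hypothetical choices for the example.

```python
import random

def empirical_risk(f, sample):
    """Fraction of (x, y) pairs that classifier f gets wrong."""
    return sum(f(x) != y for x, y in sample) / len(sample)

def erm(F, sample):
    """Empirical risk minimization: the f in F with smallest empirical risk."""
    return min(F, key=lambda f: empirical_risk(f, sample))

# Hypothetical finite class F: threshold classifiers f_t(x) = 1{x >= t}.
thresholds = [i / 10 for i in range(11)]
F = [lambda x, t=t: int(x >= t) for t in thresholds]

# Toy data: true threshold 0.5, labels flipped with probability 0.1.
random.seed(0)
sample = []
for _ in range(200):
    x = random.random()
    y = int(x >= 0.5)
    if random.random() < 0.1:
        y = 1 - y
    sample.append((x, y))

f_hat = erm(F, sample)
print(empirical_risk(f_hat, sample))  # small; roughly the label-noise level
```

The point of the agnostic setting is that nothing about P_XY was assumed here: ERM only compares candidates in F on the data, and PAC-style concentration bounds then relate the empirical risks to the true risks.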


Similar Resources

ECE 901 Lecture 7: PAC bounds and Concentration of Measure

2 Agnostic Learning We will proceed without making any assumptions on the distribution P_XY. This situation is often termed Agnostic Learning. The root of the word agnostic literally means "not known." The term agnostic learning is used to emphasize the fact that often, perhaps usually, we have no prior knowledge about P_XY and f*. The question then arises of how we can reasonably select...


PAC-Bayesian Generalization Bound on Confusion Matrix for Multi-Class Classification

In this work, we propose a PAC-Bayes bound for the generalization risk of the Gibbs classifier in the multi-class classification framework. The novelty of our work is the critical use of the confusion matrix of a classifier as an error measure; this puts our contribution in the line of work aiming at dealing with performance measures that are richer than a mere scalar criterion such as the misclas...


Risk Bounds for Levy Processes in the PAC-Learning Framework

Lévy processes play an important role in stochastic process theory. However, since the samples are non-i.i.d., statistical learning results based on i.i.d. scenarios cannot be used to study risk bounds for Lévy processes. In this paper, we present risk bounds for non-i.i.d. samples drawn from Lévy processes in the PAC-learning framework. In particular, by using a concentration inequ...


67577 – Intro. to Machine Learning, Fall Semester 2008/9

The result of the PAC model (also known as the "formal" learning model) is that if the concept class C is PAC-learnable, then the learning strategy simply consists of gathering a sufficiently large training sample S of size m > m_0(ε, δ), for given accuracy ε > 0 and confidence 0 < δ < 1 parameters, and finding a hypothesis h ∈ C that is consistent with S. The learning algorithm is then guarante...
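For a finite concept class in the realizable setting, the sample-size requirement m_0(ε, δ) alluded to above has a standard textbook form (stated here for context, not quoted from that course):

```latex
m \;\ge\; m_0(\epsilon, \delta) \;=\; \frac{1}{\epsilon}\left(\ln|C| + \ln\frac{1}{\delta}\right)
```

With this many samples, any hypothesis h ∈ C consistent with S has true error at most ε, with probability at least 1 − δ over the draw of S.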


ELEN6887 Lecture 14: Denoising Smooth Functions with Unknown Smoothness

Lipschitz functions are interesting, but they can be very rough (they can have many kinks). In many situations the functions are much smoother; this is how you would model, for example, the temperature inside a museum room. Often we don't know how smooth the function might be, so an interesting question is whether we can adapt to the unknown smoothness. In this lecture we will use the Maximum Complexit...



Publication date: 2009